Measures of outer setting constructs for implementation research: a systematic review and analysis of psychometric quality
Background: Despite their influence, outer setting barriers (e.g., policies, financing) are an infrequent focus of implementation research. The objective of this systematic review was to identify and assess the psychometric properties of measures of outer setting used in behavioral and mental health research. Methods: Data collection involved (a) search string generation, (b) title and abstract screening, (c) full-text review, (d) construct mapping, and (e) measure forward searches. Outer setting constructs were defined using the Consolidated Framework for Implementation Research (CFIR). The search strategy included four relevant constructs separately: (a) cosmopolitanism, (b) external policy and incentives, (c) patient needs and resources, and (d) peer pressure. Information was coded using nine psychometric criteria: (a) internal consistency, (b) convergent validity, (c) discriminant validity, (d) known-groups validity, (e) predictive validity, (f) concurrent validity, (g) structural validity, (h) responsiveness, and (i) norms. Frequencies were calculated to summarize the availability of psychometric information. Information quality was rated using a 5-point scale and a final median score was calculated for each measure. Results: Systematic searches yielded 20 measures: four measures of the general outer setting domain, seven of cosmopolitanism, four of external policy and incentives, four of patient needs and resources, and one measure of peer pressure. Most were subscales within full scales assessing implementation context. Typically, scales or subscales did not have any psychometric information available. Where information was available, the quality was most often rated as “1-minimal” or “2-adequate.” Conclusion: To our knowledge, this is the first systematic review to focus exclusively on measures of outer setting factors used in behavioral and mental health research and comprehensively assess a range of psychometric criteria.
The results highlight the limited quantity and quality of measures at this level. Researchers should not assume “one size fits all” when measuring outer setting constructs. Some outer setting constructs may be more appropriately and efficiently assessed using objective indices or administrative data reflective of the system rather than the individual.
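As a small illustration of the scoring approach described above (a 5-point quality rating per psychometric criterion, collapsed into a final median score per measure), here is a stdlib sketch. The measure names, criteria, and ratings below are invented for illustration, not data from the review:

```python
# Hypothetical sketch: each measure is rated on several psychometric
# criteria using the review's 5-point scale (1 = minimal ... 5 = excellent,
# an assumed coding), and a median summarizes overall quality.
from statistics import median

ratings = {
    # measure -> {psychometric criterion: 5-point quality rating}
    "Measure A": {"internal_consistency": 2, "convergent_validity": 1, "norms": 2},
    "Measure B": {"structural_validity": 1, "predictive_validity": 1},
}

def median_quality(criterion_ratings: dict) -> float:
    """Collapse per-criterion quality ratings into one median score."""
    return median(criterion_ratings.values())

for measure, crits in ratings.items():
    print(measure, median_quality(crits))
```

With ratings clustered at 1 and 2, the medians land in the "minimal" to "adequate" range, mirroring the pattern the Results describe.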
Toward criteria for pragmatic measurement in implementation research and practice: a stakeholder-driven approach using concept mapping
Background: Advancing implementation research and practice requires valid and reliable measures of implementation determinants, mechanisms, processes, strategies, and outcomes. However, researchers and implementation stakeholders are unlikely to use measures if they are not also pragmatic. The purpose of this study was to establish a stakeholder-driven conceptualization of the domains that comprise the pragmatic measure construct. It built upon a systematic review of the literature and semi-structured stakeholder interviews that generated 47 criteria for pragmatic measures, and aimed to further refine that set of criteria by identifying conceptually distinct categories of the pragmatic measure construct and providing quantitative ratings of the criteria’s clarity and importance.
Methods: Twenty-four stakeholders with expertise in implementation practice completed a concept mapping activity wherein they organized the initial list of 47 criteria into conceptually distinct categories and rated their clarity and importance. Multidimensional scaling, hierarchical cluster analysis, and descriptive statistics were used to analyze the data.
Findings: The 47 criteria were meaningfully grouped into four distinct categories: (1) acceptable, (2) compatible, (3) easy, and (4) useful. Average ratings of clarity and importance at the category and individual criteria level will be presented. Conclusions: This study advances the field of implementation science and practice by providing clear and conceptually distinct domains of the pragmatic measure construct. Next steps will include a Delphi process to develop consensus on the most important criteria and the development of quantifiable pragmatic rating criteria that can be used to assess measures.
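The analytic pipeline in the Methods (stakeholders sort criteria into piles, pile co-occurrence forms a similarity matrix, and clustering recovers conceptually distinct categories) can be sketched in miniature. Everything below, including criterion labels and co-sorting proportions, is invented for illustration and is not the study's data or software:

```python
# Toy concept-mapping analysis: hierarchical clustering of a co-sorting
# similarity matrix. Six hypothetical criteria; the first three were
# (by construction) often sorted together, as were the last three.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

criteria = ["brief", "low cost", "easy to score",
            "fits workflow", "actionable", "informative"]

# Fraction of stakeholders who placed each pair in the same pile (invented).
similarity = np.array([
    [1.0, 0.9, 0.8, 0.2, 0.1, 0.1],
    [0.9, 1.0, 0.85, 0.15, 0.1, 0.2],
    [0.8, 0.85, 1.0, 0.2, 0.15, 0.1],
    [0.2, 0.15, 0.2, 1.0, 0.8, 0.75],
    [0.1, 0.1, 0.15, 0.8, 1.0, 0.9],
    [0.1, 0.2, 0.1, 0.75, 0.9, 1.0],
])

distance = 1.0 - similarity                     # co-sorting -> dissimilarity
condensed = squareform(distance, checks=False)  # condensed form for linkage
tree = linkage(condensed, method="average")     # hierarchical clustering
labels = fcluster(tree, t=2, criterion="maxclust")  # cut into two categories
print(dict(zip(criteria, labels)))
```

The study additionally used multidimensional scaling to place criteria on a two-dimensional map before clustering; this sketch shows only the clustering step that yields the category groupings.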
Quantitative measures of health policy implementation determinants and outcomes: A systematic review
BACKGROUND: Public policy has tremendous impacts on population health. While policy development has been extensively studied, policy implementation research is newer and relies largely on qualitative methods. Quantitative measures are needed to disentangle differential impacts of policy implementation determinants (i.e., barriers and facilitators) and outcomes to ensure intended benefits are realized. Implementation outcomes include acceptability, adoption, appropriateness, compliance/fidelity, feasibility, penetration, sustainability, and costs. This systematic review identified quantitative measures that are used to assess health policy implementation determinants and outcomes and evaluated the quality of these measures.
METHODS: Three frameworks guided the review: Implementation Outcomes Framework (Proctor et al.), Consolidated Framework for Implementation Research (Damschroder et al.), and Policy Implementation Determinants Framework (Bullock et al.). Six databases were searched: Medline, CINAHL Plus, PsycInfo, PAIS, ERIC, and Worldwide Political. Searches were limited to English language, peer-reviewed journal articles published January 1995 to April 2019. Search terms addressed four levels: health, public policy, implementation, and measurement. Empirical studies of public policies addressing physical or behavioral health with quantitative self-report or archival measures of policy implementation with at least two items assessing implementation outcomes or determinants were included. Consensus scoring of the Psychometric and Pragmatic Evidence Rating Scale assessed the quality of measures.
RESULTS: Database searches yielded 8417 non-duplicate studies, with 870 (10.3%) undergoing full-text screening, yielding 66 studies. From the included studies, 70 unique measures were identified to quantitatively assess implementation outcomes and/or determinants. Acceptability, feasibility, appropriateness, and compliance were the most commonly measured implementation outcomes. Common determinants in the identified measures were organizational culture, implementation climate, and readiness for implementation, each aspects of the internal setting. Pragmatic quality ranged from adequate to good, with most measures freely available, brief, and at high school reading level. Few psychometric properties were reported.
CONCLUSIONS: Well-tested quantitative measures of implementation internal settings were under-utilized in policy studies. Further development and testing of external context measures are warranted. This review is intended to stimulate measure development and high-quality assessment of health policy implementation outcomes and determinants to help practitioners and researchers spread evidence-informed policies to improve population health.
REGISTRATION: Not registered
Operationalizing the ‘pragmatic’ measures construct using stakeholder feedback and a multi-method approach
Abstract
Context
Implementation science measures are rarely used by stakeholders to inform and enhance clinical program change. Little is known about what makes implementation measures pragmatic (i.e., practical) for use in community settings; thus, the present study’s objective was to generate a clinical stakeholder-driven operationalization of a pragmatic measures construct.
Evidence acquisition
The pragmatic measures construct was defined using: 1) a systematic literature review to identify dimensions of the construct using PsycINFO and PubMed databases, and 2) interviews with an international stakeholder panel (N = 7) who were asked about their perspectives of pragmatic measures.
Evidence synthesis
Combined results from the systematic literature review and stakeholder interviews revealed a final list of 47 short statements (e.g., feasible, low cost, brief) describing pragmatic measures, which will allow for the development of a rigorous, stakeholder-driven conceptualization of the pragmatic measures construct.
Conclusions
Results revealed significant overlap between terms related to the pragmatic construct in the existing literature and stakeholder interviews. However, a number of terms were unique to each methodology. This underscores the importance of understanding stakeholder perspectives of criteria measuring the pragmatic construct. These results will be used to inform future phases of the project where stakeholders will determine the relative importance and clarity of each dimension of the pragmatic construct, as well as their priorities for the pragmatic dimensions. Taken together, these results will be incorporated into a pragmatic rating system for existing implementation science measures to support implementation science and practice.
An updated protocol for a systematic review of implementation-related measures
Abstract Background Implementation science is the study of strategies used to integrate evidence-based practices into real-world settings (Eccles and Mittman, Implement Sci. 1(1):1, 2006). Central to the identification of replicable, feasible, and effective implementation strategies is the ability to assess the impact of contextual constructs and intervention characteristics that may influence implementation, but several measurement issues make this work quite difficult. For instance, it is unclear which constructs have no measures and which measures have any evidence of psychometric properties like reliability and validity. As part of a larger set of studies to advance implementation science measurement (Lewis et al., Implement Sci. 10:102, 2015), we will complete systematic reviews of measures that map onto the Consolidated Framework for Implementation Research (Damschroder et al., Implement Sci. 4:50, 2009) and the Implementation Outcomes Framework (Proctor et al., Adm Policy Ment Health. 38(2):65-76, 2011), the protocol for which is described in this manuscript. Methods Our primary databases will be PubMed and Embase. Our search strings will comprise five levels: (1) the outcome or construct term; (2) terms for measure; (3) terms for evidence-based practice; (4) terms for implementation; and (5) terms for mental health. Two trained research specialists will independently review all titles and abstracts followed by full-text review for inclusion. The research specialists will then conduct measure-forward searches using the “cited by” function to identify all published empirical studies using each measure. The measure and associated publications will be compiled in a packet for data extraction. Data relevant to our Psychometric and Pragmatic Evidence Rating Scale (PAPERS) will be independently extracted and then rated using a worst score counts methodology reflecting “poor” to “excellent” evidence.
Discussion We will build a centralized, accessible, searchable repository through which researchers, practitioners, and other stakeholders can identify psychometrically and pragmatically strong measures of implementation contexts, processes, and outcomes. By facilitating the employment of psychometrically and pragmatically strong measures identified through this systematic review, the repository would enhance the cumulativeness, reproducibility, and applicability of research findings in the rapidly growing field of implementation science.
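The “worst score counts” rule named in the Methods can be illustrated with a tiny sketch. The scale labels below follow the abstract's “poor” to “excellent” range, but the exact numeric coding and evidence values are our assumptions, not the published PAPERS scale:

```python
# Hypothetical worst-score-counts rating: a criterion's rating is set by
# the weakest piece of evidence across all studies using the measure.
SCALE = {0: "poor", 1: "minimal", 2: "adequate", 3: "good", 4: "excellent"}

def worst_score_counts(evidence_scores: list[int]) -> str:
    """Rate a criterion by its lowest evidence score across studies."""
    return SCALE[min(evidence_scores)]

# Two strong studies cannot offset one weak one under this rule.
print(worst_score_counts([3, 4, 1]))
```

The design choice is deliberately conservative: a measure cannot earn a high rating on a criterion unless every piece of published evidence supports it.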
Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science : Seattle, WA, USA. 24-26 September 2015.
Introduction to the 3rd Biennial Conference of the Society for Implementation Research Collaboration: advancing efficient methodologies through team science and community partnerships Cara Lewis, Doyanne Darnell, Suzanne Kerns, Maria Monroe-DeVita, Sara J. Landes, Aaron R. Lyon, Cameo Stanick, Shannon Dorsey, Jill Locke, Brigid Marriott, Ajeng Puspitasari, Caitlin Dorsey, Karin Hendricks, Andria Pierson, Phil Fizur, Katherine A. Comtois A1: A behavioral economic perspective on adoption, implementation, and sustainment of evidence-based interventions Lawrence A. Palinkas A2: Towards making scale up of evidence-based practices in child welfare systems more efficient and affordable Patricia Chamberlain A3: Mixed method examination of strategic leadership for evidence-based practice implementation Gregory A. Aarons, Amy E. Green, Mark G. Ehrhart, Elise M. Trott, Cathleen E. Willging A4: Implementing practice change in Federally Qualified Health Centers: Learning from leaders’ experiences Maria E. Fernandez, Nicholas H. Woolf, Shuting (Lily) Liang, Natalia I. Heredia, Michelle Kegler, Betsy Risendal, Andrea Dwyer, Vicki Young, Dayna Campbell, Michelle Carvalho, Yvonne Kellar-Guenther A5: Efficient synthesis: Using qualitative comparative analysis and the Consolidated Framework for Implementation Research across diverse studies Laura J. Damschroder, Julie C. 
Lowery A6: Establishing a veterans engagement group to empower patients and inform Veterans Affairs (VA) health services research Sarah S. Ono, Kathleen F. Carlson, Erika K. Cottrell, Maya E. O’Neil, Travis L. Lovejoy A7: Building patient-practitioner partnerships in community oncology settings to implement behavioral interventions for anxious and depressed cancer survivors Joanna J. Arch, Jill L. Mitchell A8: Tailoring a Cognitive Behavioral Therapy implementation protocol using mixed methods, conjoint analysis, and implementation teams Cara C. Lewis, Brigid R. Marriott, Kelli Scott A9: Wraparound Structured Assessment and Review (WrapSTAR): An efficient, yet comprehensive approach to Wraparound implementation evaluation Jennifer Schurer Coldiron, Eric J. Bruns, Alyssa N. Hook A10: Improving the efficiency of standardized patient assessment of clinician fidelity: A comparison of automated actor-based and manual clinician-based ratings Benjamin C. Graham, Katelin Jordan A11: Measuring fidelity on the cheap Rochelle F. Hanson, Angela Moreland, Benjamin E. Saunders, Heidi S. Resnick A12: Leveraging routine clinical materials to assess fidelity to an evidence-based psychotherapy Shannon Wiltsey Stirman, Cassidy A. Gutner, Jennifer Gamarra, Dawne Vogt, Michael Suvak, Jennifer Schuster Wachen, Katherine Dondanville, Jeffrey S. Yarvis, Jim Mintz, Alan L. Peterson, Elisa V. Borah, Brett T. Litz, Alma Molino, Stacey Young McCaughan, Patricia A. Resick A13: The video vignette survey: An efficient process for gathering diverse community opinions to inform an intervention Nancy Pandhi, Nora Jacobson, Neftali Serrano, Armando Hernandez, Elizabeth Zeidler-Schreiter, Natalie Wietfeldt, Zaher Karp A14: Using integrated administrative data to evaluate implementation of a behavioral health and trauma screening for children and youth in foster care Michael D. Pullmann, Barbara Lucenko, Bridget Pavelle, Jacqueline A. Uomoto, Andrea Negrete, Molly Cevasco, Suzanne E. U. 
Kerns A15: Intermediary organizations as a vehicle to promote efficiency and speed of implementation Robert P. Franks, Christopher Bory A16: Applying the Consolidated Framework for Implementation Research constructs directly to qualitative data: The power of implementation science in action Edward J. Miech, Teresa M. Damush A17: Efficient and effective scaling-up, screening, brief interventions, and referrals to treatment (SBIRT) training: a snowball implementation model Jason Satterfield, Derek Satre, Maria Wamsley, Patrick Yuan, Patricia O’Sullivan A18: Matching models of implementation to system needs and capacities: addressing the human factor Helen Best, Susan Velasquez A19: Agency characteristics that facilitate efficient and successful implementation efforts Miya Barnett, Lauren Brookman-Frazee, Jennifer Regan, Nicole Stadnick, Alison Hamilton, Anna Lau A20: Rapid assessment process: Application to the Prevention and Early Intervention transformation in Los Angeles County Jennifer Regan, Alison Hamilton, Nicole Stadnick, Miya Barnett, Anna Lau, Lauren Brookman-Frazee A21: The development of the Evidence-Based Practice-Concordant Care Assessment: An assessment tool to examine treatment strategies across practices Nicole Stadnick, Anna Lau, Miya Barnett, Jennifer Regan, Scott Roesch, Lauren Brookman-Frazee A22: Refining a compilation of discrete implementation strategies and determining their importance and feasibility Byron J. Powell, Thomas J. Waltz, Matthew J. Chinman, Laura Damschroder, Jeffrey L. Smith, Monica M. Matthieu, Enola K. Proctor, JoAnn E. Kirchner A23: Structuring complex recommendations: Methods and general findings Thomas J. Waltz, Byron J. Powell, Matthew J. Chinman, Laura J. Damschroder, Jeffrey L. Smith, Monica J. Matthieu, Enola K. Proctor, JoAnn E. 
Kirchner A24: Implementing prolonged exposure for post-traumatic stress disorder in the Department of Veterans Affairs: Expert recommendations from the Expert Recommendations for Implementing Change (ERIC) project Monica M. Matthieu, Craig S. Rosen, Thomas J. Waltz, Byron J. Powell, Matthew J. Chinman, Laura J. Damschroder, Jeffrey L. Smith, Enola K. Proctor, JoAnn E. Kirchner A25: When readiness is a luxury: Co-designing a risk assessment and quality assurance process with violence prevention frontline workers in Seattle, WA Sarah C. Walker, Asia S. Bishop, Mariko Lockhart A26: Implementation potential of structured recidivism risk assessments with justice- involved veterans: Qualitative perspectives from providers Allison L. Rodriguez, Luisa Manfredi, Andrea Nevedal, Joel Rosenthal, Daniel M. Blonigen A27: Developing empirically informed readiness measures for providers and agencies for the Family Check-Up using a mixed methods approach Anne M. Mauricio, Thomas D. Dishion, Jenna Rudo-Stern, Justin D. Smith A28: Pebbles, rocks, and boulders: The implementation of a school-based social engagement intervention for children with autism Jill Locke, Courtney Benjamin Wolk, Colleen Harker, Anne Olsen, Travis Shingledecker, Frances Barg, David Mandell, Rinad S. Beidas A29: Problem Solving Teletherapy (PST.Net): A stakeholder analysis examining the feasibility and acceptability of teletherapy in community based aging services Marissa C. Hansen, Maria P. Aranda, Isabel Torres-Vigil A30: A case of collaborative intervention design eventuating in behavior therapy sustainment and diffusion Bryan Hartzler A31: Implementation of suicide risk prevention in an integrated delivery system: Mental health specialty services Bradley Steinfeld, Tory Gildred, Zandrea Harlin, Fredric Shephard A32: Implementation team, checklist, evaluation, and feedback (ICED): A step-by-step approach to Dialectical Behavior Therapy program implementation Matthew S. Ditty, Andrea Doyle, John A. 
Bickel III, Katharine Cristaudo A33: The challenges in implementing multiple evidence-based practices in a community mental health setting Dan Fox, Sonia Combs A34: Using electronic health record technology to promote and support evidence-based practice assessment and treatment intervention David H. Lischner A35: Are existing frameworks adequate for measuring implementation outcomes? Results from a new simulation methodology Richard A. Van Dorn, Stephen J. Tueller, Jesse M. Hinde, Georgia T. Karuntzos A36: Taking global local: Evaluating training of Washington State clinicians in a modularized cognitive behavioral therapy approach designed for low-resource settings Maria Monroe-DeVita, Roselyn Peterson, Doyanne Darnell, Lucy Berliner, Shannon Dorsey, Laura K. Murray A37: Attitudes toward evidence-based practices across therapeutic orientations Yevgeny Botanov, Beverly Kikuta, Tianying Chen, Marivi Navarro-Haro, Anthony DuBose, Kathryn E. Korslund, Marsha M. Linehan A38: Predicting the use of an evidence-based intervention for autism in birth-to-three programs Colleen M. Harker, Elizabeth A. Karp, Sarah R. Edmunds, Lisa V. Ibañez, Wendy L. Stone A39: Supervision practices and improved fidelity across evidence-based practices: A literature review Mimi Choy-Brown A40: Beyond symptom tracking: clinician perceptions of a hybrid measurement feedback system for monitoring treatment fidelity and client progress Jack H. Andrews, Benjamin D. Johnides, Estee M. Hausman, Kristin M. Hawley A41: A guideline decision support tool: From creation to implementation Beth Prusaczyk, Alex Ramsey, Ana Baumann, Graham Colditz, Enola K. Proctor A42: Dabblers, bedazzlers, or total makeovers: Clinician modification of a common elements cognitive behavioral therapy approach Rosemary D. 
Meza, Shannon Dorsey, Shannon Wiltsey-Stirman, Georganna Sedlar, Leah Lucid A43: Characterization of context and its role in implementation: The impact of structure, infrastructure, and metastructure Caitlin Dorsey, Brigid Marriott, Nelson Zounlome, Cara Lewis A44: Effects of consultation method on implementation of cognitive processing therapy for post-traumatic stress disorder Cassidy A. Gutner, Candice M. Monson, Norman Shields, Marta Mastlej, Meredith SH Landy, Jeanine Lane, Shannon Wiltsey Stirman A45: Cross-validation of the Implementation Leadership Scale factor structure in child welfare service organizations Natalie K. Finn, Elisa M. Torres, Mark. G. Ehrhart, Gregory A. Aarons A46: Sustainability of integrated smoking cessation care in Veterans Affairs posttraumatic stress disorder clinics: A qualitative analysis of focus group data from learning collaborative participants Carol A. Malte, Aline Lott, Andrew J. Saxon A47: Key characteristics of effective mental health trainers: The creation of the Measure of Effective Attributes of Trainers (MEAT) Meredith Boyd, Kelli Scott, Cara C. Lewis A48: Coaching to improve teacher implementation of evidence-based practices (EBPs) Jennifer D. Pierce A49: Factors influencing the implementation of peer-led health promotion programs targeting seniors: A literature review Agathe Lorthios-Guilledroit, Lucie Richard, Johanne Filiatrault A50: Developing treatment fidelity rating systems for psychotherapy research: Recommendations and lessons learned Kevin Hallgren, Shirley Crotwell, Rosa Muñoz, Becky Gius, Benjamin Ladd, Barbara McCrady, Elizabeth Epstein A51: Rapid translation of alcohol prevention science John D. Clapp, Danielle E. 
Ruderman A52: Factors implicated in successful implementation: evidence to inform improved implementation from high and low-income countries Melanie Barwick, Raluca Barac, Stanley Zlotkin, Laila Salim, Marnie Davidson A53: Tracking implementation strategies prospectively: A practical approach Alicia C. Bunger, Byron J. Powell, Hillary A. Robertson A54: Trained but not implementing: the need for effective implementation planning tools Christopher Botsko A55: Evidence, context, and facilitation variables related to implementation of Dialectical Behavior Therapy: Qualitative results from a mixed methods inquiry in the Department of Veterans Affairs Sara J. Landes, Brandy N. Smith, Allison L. Rodriguez, Lindsay R. Trent, Monica M. Matthieu A56: Learning from implementation as usual in children’s mental health Byron J. Powell, Enola K. Proctor A57: Rates and predictors of implementation after Dialectical Behavior Therapy Intensive Training Melanie S. Harned, Marivi Navarro-Haro, Kathryn E. Korslund, Tianying Chen, Anthony DuBose, André Ivanoff, Marsha M. Linehan A58: Socio-contextual determinants of research evidence use in public-youth systems of care Antonio R. Garcia, Minseop Kim, Lawrence A. Palinkas, Lonnie Snowden, John Landsverk A59: Community resource mapping to integrate evidence-based depression treatment in primary care in Brazil: A pilot project Annika C. Sweetland, Maria Jose Fernandes, Edilson Santos, Cristiane Duarte, Afrânio Kritski, Noa Krawczyk, Caitlin Nelligan, Milton L. Wainberg A60: The use of concept mapping to efficiently identify determinants of implementation in the National Institute of Health--President’s Emergent Plan for AIDS Relief Prevention of Mother to Child HIV Transmission Implementation Science Alliance Gregory A. Aarons, David H. Sommerfeld, Benjamin Chi, Echezona Ezeanolue, Rachel Sturke, Lydia Kline, Laura Guay, George Siberry A61: Longitudinal remote consultation for implementing collaborative care for depression Ian M. 
Bennett, Rinad Beidas, Rachel Gold, Johnny Mao, Diane Powers, Mindy Vredevoogd, Jurgen Unutzer A62: Integrating a peer coach model to support program implementation and ensure long- term sustainability of the Incredible Years in community-based settings Jennifer Schroeder, Lane Volpe, Julie Steffen A63: Efficient sustainability: Existing community based supervisors as evidence-based treatment supports Shannon Dorsey, Michael D Pullmann, Suzanne E. U. Kerns, Nathaniel Jungbluth, Lucy Berliner, Kelly Thompson, Eliza Segell A64: Establishment of a national practice-based implementation network to accelerate adoption of evidence-based and best practices Pearl McGee-Vincent, Nancy Liu, Robyn Walser, Jennifer Runnals, R. Keith Shaw, Sara J. Landes, Craig Rosen, Janet Schmidt, Patrick Calhoun A65: Facilitation as a mechanism of implementation in a practice-based implementation network: Improving care in a Department of Veterans Affairs post-traumatic stress disorder outpatient clinic Ruth L. Varkovitzky, Sara J. Landes A66: The ACT SMART Toolkit: An implementation strategy for community-based organizations providing services to children with autism spectrum disorder Amy Drahota, Jonathan I. Martinez, Brigitte Brikho, Rosemary Meza, Aubyn C. Stahmer, Gregory A. Aarons A67: Supporting Policy In Health with Research: An intervention trial (SPIRIT) - protocol and early findings Anna Williamson A68: From evidence based practice initiatives to infrastructure: Lessons learned from a public behavioral health system’s efforts to promote evidence based practices Ronnie M. Rubin, Byron J. Powell, Matthew O. Hurford, Shawna L. Weaver, Rinad S. Beidas, David S. Mandell, Arthur C. Evans A69: Applying the policy ecology model to Philadelphia’s behavioral health transformation efforts Byron J. Powell, Rinad S. Beidas, Ronnie M. Rubin, Rebecca E. Stewart, Courtney Benjamin Wolk, Samantha L. Matlin, Shawna Weaver, Matthew O. Hurford, Arthur C. Evans, Trevor R. Hadley, David S. 
Mandell A70: A model for providing methodological expertise to advance dissemination and implementation of health discoveries in Clinical and Translational Science Award institutions Donald R. Gerke, Beth Prusaczyk, Ana Baumann, Ericka M. Lewis, Enola K. Proctor A71: Establishing a research agenda for the Triple P Implementation Framework Jenna McWilliam, Jacquie Brown, Michelle Tucker A72: Cheap and fast, but what is “best?”: Examining implementation outcomes across sites in a state-wide scaled-up evidence-based walking program, Walk With Ease Kathleen P Conte A73: Measurement feedback systems in mental health: Initial review of capabilities and characteristics Aaron R. Lyon, Meredith Boyd, Abigail Melvin, Cara C. Lewis, Freda Liu, Nathaniel Jungbluth A74: A qualitative investigation of case managers’ attitudes toward implementation of a measurement feedback system in a public mental health system for youth Amelia Kotte, Kaitlin A. Hill, Albert C. Mah, Priya A. Korathu-Larson, Janelle R. Au, Sonia Izmirian, Scott Keir, Brad J. Nakamura, Charmaine K. Higa-McMillan A75: Multiple pathways to sustainability: Using Qualitative Comparative Analysis to uncover the necessary and sufficient conditions for successful community-based implementation Brittany Rhoades Cooper, Angie Funaiole, Eleanor Dizon A76: Prescribers’ perspectives on opioids and benzodiazepines and medication alerts to reduce co-prescribing of these medications Eric J. Hawkins, Carol A. Malte, Hildi J. Hagedorn, Douglas Berger, Anissa Frank, Aline Lott, Carol E. Achtmeyer, Anthony J. Mariano, Andrew J. 
Saxon A77: Adaptation of Coordinated Anxiety Learning and Management for comorbid anxiety and substance use disorders: Delivery of evidence-based treatment for anxiety in addictions treatment centers Kate Wolitzky-Taylor, Richard Rawson, Richard Ries, Peter Roy-Byrne, Michelle Craske A78: Opportunities and challenges of measuring program implementation with online surveys Dena Simmons, Catalina Torrente, Lori Nathanson, Grace Carroll A79: Observational assessment of fidelity to a family-centered prevention program: Effectiveness and efficiency Justin D. Smith, Kimbree Brown, Karina Ramos, Nicole Thornton, Thomas J. Dishion, Elizabeth A. Stormshak, Daniel S. Shaw, Melvin N. Wilson A80: Strategies and challenges in housing first fidelity: A multistate qualitative analysis Mimi Choy-Brown, Emmy Tiderington, Bikki Tran Smith, Deborah K. Padgett A81: Procurement and contracting as an implementation strategy: Getting To Outcomes® contracting Ronnie M. Rubin, Marilyn L. Ray, Abraham Wandersman, Andrea Lamont, Gordon Hannah, Kassandra A. Alia, Matthew O. Hurford, Arthur C. Evans A82: Web-based feedback to aid successful implementation: The interactive Stages of Implementation Completion (SIC)™ tool Lisa Saldana, Holle Schaper, Mark Campbell, Patricia Chamberlain A83: Efficient methodologies for monitoring fidelity in routine implementation: Lessons from the Allentown Social Emotional Learning Initiative Valerie B. Shapiro, B.K. Elizabeth Kim, Jennifer L. Fleming, Paul A. LeBuffe A84: The Society for Implementation Research Collaboration (SIRC) implementation development workshop: Results from a new methodology for enhancing implementation science proposals Sara J. Landes, Cara C. Lewis, Allison L. Rodriguez, Brigid R. Marriott, Katherine Anne Comtois A85: An update on the Society for Implementation Research Collaboration (SIRC) Instrument Review Project.
Beyond factor analysis: Multidimensionality and the Parkinson’s Disease Sleep Scale-Revised
Many studies have sought to describe the relationship between sleep disturbance and cognition in Parkinson’s disease (PD). The Parkinson’s Disease Sleep Scale (PDSS) and its variants (the Parkinson’s Disease Sleep Scale-Revised, PDSS-R, and the Parkinson’s Disease Sleep Scale-2, PDSS-2) quantify a range of symptoms impacting sleep in only 15 items. However, data from these scales may be problematic because the included items have considerable conceptual breadth and the constructs assessed may overlap. Multidimensional measurement models, which account for the tendency of items to measure multiple constructs, may model variance more accurately than traditional confirmatory factor analysis. In the present study, we tested the hypothesis that a multidimensional model (a bifactor model) is more appropriate than traditional factor analysis for data generated by these types of scales, using data collected with the PDSS-R as an exemplar. A total of 166 participants diagnosed with idiopathic PD took part in this study. Using PDSS-R data, we compared three models: a unidimensional model; a 3-factor model with sub-factors measuring insomnia, motor symptoms and obstructive sleep apnoea (OSA), and REM sleep behaviour disorder (RBD) symptoms; and a confirmatory bifactor model with both a general factor and the same three sub-factors. Only the confirmatory bifactor model achieved satisfactory model fit, suggesting that PDSS-R data are multidimensional. There were differential associations between factor scores and patient characteristics, suggesting that some PDSS-R items, but not others, are influenced by mood and personality in addition to sleep symptoms. Multidimensional measurement models may also be helpful for the PDSS and PDSS-2 scales and may improve the sensitivity of these instruments.
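The model comparison above can be summarized in two measurement equations: in the bifactor model each item loads on a general sleep-disturbance factor and on exactly one specific factor, whereas the unidimensional model keeps only the general term. A sketch under standard bifactor assumptions (the symbols are ours, not the paper's notation):

```latex
% Unidimensional model: item i, person j, loading \lambda_i
x_{ij} = \lambda_i \, g_j + \varepsilon_{ij}

% Bifactor model: item i also loads on its one specific factor s(i),
% here s(i) \in \{ \text{insomnia}, \text{motor/OSA}, \text{RBD} \},
% with the general factor g orthogonal to every specific factor f
x_{ij} = \lambda_i^{G} \, g_j + \lambda_i^{S} \, f_{s(i),j} + \varepsilon_{ij}
```

Because the specific factors absorb shared variance within item clusters that the general factor cannot explain, the bifactor model can fit multidimensional data where the unidimensional and correlated-factors models fail.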
Measuring progress and projecting attainment on the basis of past trends of the health-related Sustainable Development Goals in 188 countries: an analysis from the Global Burden of Disease Study 2016
The UN’s Sustainable Development Goals (SDGs) are grounded in the global ambition of “leaving no one behind”. Understanding today’s gains and gaps for the health-related SDGs is essential for decision makers as they aim to improve the health of populations. As part of the Global Burden of Diseases, Injuries, and Risk Factors Study 2016 (GBD 2016), we measured 37 of the 50 health-related SDG indicators over the period 1990–2016 for 188 countries, and then on the basis of these past trends, we projected indicators to 2030.
Global, regional, and national incidence, prevalence, and years lived with disability for 328 diseases and injuries for 195 countries, 1990–2016: a systematic analysis for the Global Burden of Disease Study 2016
As mortality rates decline, life expectancy increases, and populations age, non-fatal outcomes of diseases and injuries are becoming a larger component of the global burden of disease. The Global Burden of Diseases, Injuries, and Risk Factors Study 2016 (GBD 2016) provides a comprehensive assessment of prevalence, incidence, and years lived with disability (YLDs) for 328 causes in 195 countries and territories from 1990 to 2016.